Feat: multiple endpoints using a list of LitServer #276

Draft · wants to merge 39 commits into main
Conversation

@bhimrazy (Contributor) commented Sep 12, 2024

Before submitting
  • Was this discussed/agreed via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

⚠️ How does this PR impact the user? ⚠️

As a user, I want to host multiple endpoints for different purposes, such as serving an embedding API, prediction API, etc., on the same server while maintaining LitServer features.

What does this PR do?

Fixes #271.

Usage

```python
# server.py
from litserve.server import LitServer, run_all
from litserve.test_examples import SimpleLitAPI


class SimpleLitAPI1(SimpleLitAPI):
    def setup(self, device):
        self.model = lambda x: x**1


class SimpleLitAPI2(SimpleLitAPI):
    def setup(self, device):
        self.model = lambda x: x**2


class SimpleLitAPI3(SimpleLitAPI):
    def setup(self, device):
        self.model = lambda x: x**3


class SimpleLitAPI4(SimpleLitAPI):
    def setup(self, device):
        self.model = lambda x: x**4


if __name__ == "__main__":
    server1 = LitServer(SimpleLitAPI1(), api_path="/predict-1")
    server2 = LitServer(SimpleLitAPI2(), api_path="/predict-2")
    server3 = LitServer(SimpleLitAPI3(), api_path="/predict-3")
    server4 = LitServer(SimpleLitAPI4(), api_path="/predict-4")
    run_all([server1, server2, server3, server4], port=8000)
```

```python
# client.py
import requests

for i in range(1, 5):
    resp = requests.post(f"http://127.0.0.1:8000/predict-{i}", json={"input": 4.0})
    assert resp.status_code == 200, f"Expected status 200 but got {resp.status_code}"
    assert resp.json() == {"output": 4.0**i}, f"Expected output {4.0**i} but got {resp.json()}"
```

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@codecov bot commented Sep 12, 2024

Codecov Report

Attention: Patch coverage is 98.43750% with 1 line in your changes missing coverage. Please review.

Project coverage is 95%. Comparing base (f475369) to head (e5db967).

Additional details and impacted files
```
@@         Coverage Diff         @@
##           main   #276   +/-   ##
===================================
  Coverage    95%    95%           
===================================
  Files        14     14           
  Lines      1082   1143   +61     
===================================
+ Hits       1025   1085   +60     
- Misses       57     58    +1     
```

@aniketmaurya (Collaborator) left a comment

Awesome @bhimrazy!! We might have to look into mounting FastAPI app instances like here so that the endpoints are visible from all the uvicorn servers.

src/litserve/server.py — review comment (outdated, resolved)
@bhimrazy bhimrazy changed the title [WIP]: Add support for multi-endpoints Feat: Add support for multi-endpoints Sep 15, 2024
@bhimrazy bhimrazy changed the title Feat: Add support for multi-endpoints Feat: Add support for running multi-litservers in combined form Sep 17, 2024
@bhimrazy bhimrazy changed the title Feat: Add support for running multi-litservers in combined form Feat: Add support for running multiple litservers in combined form Sep 17, 2024
src/litserve/server.py — review comment (outdated, resolved)
@aniketmaurya aniketmaurya changed the title Feat: Add support for running multiple litservers in combined form Feat: multiple endpoints using a list of LitServer Sep 17, 2024
src/litserve/server.py — review comment (outdated, resolved)
@williamFalcon williamFalcon self-requested a review September 18, 2024 16:33
@williamFalcon (Contributor) left a comment

ok, let's pause this for now while we validate with customers what a real use case is...
rushing into something will only complicate the codebase, with no real user proof of exactly what problems this solves for them...

this feels very "theoretical" at this stage... i want to root this in user feedback first @lantiga @aniketmaurya

@bhimrazy bhimrazy marked this pull request as draft September 18, 2024 16:50
@aceliuchanghong commented

still can't use this function

@VikramxD commented Oct 2, 2024

In the meantime, until this gets merged, is there any other way to run multiple endpoints in one main server?

@williamFalcon (Contributor) commented

this is paused and not scheduled to be merged until we have a very clear use case.

so, the best way to unblock this is to share code showing what you are trying to do, and explain why you wouldn't just run two separate servers on the same machine.
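For readers looking for the suggested workaround today, the "two separate servers on the same machine" pattern can be sketched with nothing but the standard library. This uses `http.server` purely as a stand-in for LitServer; the server names and ports are invented for illustration.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


def make_handler(name):
    """Handler factory: each server reports its own name, standing in
    for a distinct model endpoint (e.g. embeddings vs. predictions)."""
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps({"server": name}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # keep the demo quiet
            pass

    return Handler


# One machine, two independent HTTP servers on separate ports -- the same
# shape as running two LitServer processes side by side.
servers = []
for name, port in [("embed", 8101), ("predict", 8102)]:
    srv = HTTPServer(("127.0.0.1", port), make_handler(name))
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    servers.append(srv)

responses = {p: json.load(urlopen(f"http://127.0.0.1:{p}")) for p in (8101, 8102)}
for srv in servers:
    srv.shutdown()
```

With LitServer, the equivalent would be two `server.run(port=...)` processes on different ports, with a reverse proxy in front if a single public port is needed.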


Labels: enhancement (New feature or request)
Projects: None yet
Linked issue (may close on merge): Is it possible to support multiple endpoints for one server?
6 participants